# Macedonian language optimization
## MKLLM-7B-Instruct
MKLLM-7B is an open-source large language model for Macedonian, built through continued pre-training of Mistral-7B-v0.1 on a mix of Macedonian and English text.
Tags: Large Language Model · Transformers · Supports Multiple Languages

Author: trajkovnikola
## XLMR-BERTovski
A language model pre-trained on large-scale Bulgarian and Macedonian texts, part of the MaCoCu project.
Tags: Large Language Model · Other
Author: MaCoCu